Complexity Analysis of Second-order Line-search Algorithms for Smooth Nonconvex Optimization

Authors

  • CLÉMENT W. ROYER
  • STEPHEN J. WRIGHT
Abstract

There has been much recent interest in finding unconstrained local minima of smooth functions, due in part to the prevalence of such problems in machine learning and robust statistics. A particular focus is algorithms with good complexity guarantees. Second-order Newton-type methods that make use of regularization and trust regions have been analyzed from such a perspective. More recent proposals, based chiefly on first-order methodology, have also been shown to enjoy optimal iteration complexity rates, while providing additional guarantees on computational cost. In this paper, we present an algorithm with favorable complexity properties that differs in two significant ways from other recently proposed methods. First, it is based on line searches only: Each step involves computation of a search direction, followed by a backtracking line search along that direction. Second, its analysis is rather straightforward, relying for the most part on the standard technique for demonstrating sufficient decrease in the objective from backtracking. In the latter part of the paper, we consider inexact computation of the search directions, using iterative methods in linear algebra: the conjugate gradient and Lanczos methods. We derive modified convergence and complexity results for these more practical methods.
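The two building blocks the abstract describes can be sketched generically: a search direction obtained by an (inexact) conjugate gradient solve of the Newton system, followed by a backtracking line search enforcing the Armijo sufficient-decrease condition. The sketch below is illustrative only (the constants, the CG iteration cap, and the steepest-descent fallback are assumptions, not the authors' algorithm, which additionally exploits negative curvature directions that this sketch merely detects and sidesteps):

    import numpy as np

    def cg_solve(H, g, tol=1e-8, max_iter=100):
        """Approximately solve H d = -g by conjugate gradient.

        A generic CG loop; the paper's method also makes use of
        negative curvature, which this sketch only detects."""
        d = np.zeros_like(g)
        r = -g.copy()               # residual of H d = -g at d = 0
        p = r.copy()
        for _ in range(max_iter):
            Hp = H @ p
            curv = p @ Hp
            if curv <= 0:           # indefinite Hessian: stop here
                break
            alpha = (r @ r) / curv
            d += alpha * p
            r_new = r - alpha * Hp
            if np.linalg.norm(r_new) < tol:
                break
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
        return d

    def backtracking_step(f, grad_f, hess_f, x, beta=0.5, c=1e-4):
        """One iteration: compute a direction, then backtrack until
        the Armijo sufficient-decrease condition holds."""
        g = grad_f(x)
        d = cg_solve(hess_f(x), g)
        if g @ d >= 0:              # not a descent direction: use -g
            d = -g
        t = 1.0
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= beta               # standard backtracking line search
        return x + t * d

The sufficient-decrease argument inside the backtracking loop is the "standard technique" the abstract refers to.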


Similar resources

An efficient improvement of the Newton method for solving nonconvex optimization problems

The Newton method is one of the best-known line-search methods for minimizing functions. It is well known that the search direction and the step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method for solving unconstrained optimization problems is presented. The significant ...
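To illustrate why the search direction matters on nonconvex problems (the values below are hypothetical), the pure Newton direction can point uphill when the Hessian is indefinite:

    import numpy as np

    # At a point with an indefinite Hessian, the Newton direction
    # d = -H^{-1} g need not be a descent direction.
    g = np.array([1.0, 2.0])                 # gradient (illustrative)
    H = np.array([[1.0, 0.0], [0.0, -2.0]])  # indefinite Hessian
    d = np.linalg.solve(H, -g)               # Newton direction
    print(g @ d)                             # 1.0 > 0: ascent direction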


On the Oracle Complexity of First-Order and Derivative-Free Algorithms for Smooth Nonconvex Minimization

The (optimal) function/gradient evaluation worst-case complexity analysis available for the Adaptive Regularization algorithm with Cubics (ARC) for nonconvex smooth unconstrained optimization is extended to finite-difference versions of this algorithm, yielding complexity bounds for first-order and derivative-free methods applied to the same problem class. A comparison with the results obtai...
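A finite-difference version replaces exact gradients by function-value differences. A minimal central-difference sketch, with an illustrative step size h (the paper's actual difference scheme and step choice may differ):

    import numpy as np

    def fd_gradient(f, x, h=1e-6):
        """Central-difference gradient estimate: 2n extra function
        evaluations per call, accuracy O(h^2) for smooth f."""
        n = x.size
        g = np.zeros(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2 * h)
        return g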


Universal regularization methods – varying the power, the smoothness and the accuracy

Adaptive cubic regularization methods have emerged as a credible alternative to line-search and trust-region methods for smooth nonconvex optimization, with optimal complexity amongst second-order methods. Here we consider a general/new class of adaptive regularization methods that use first- or higher-order local Taylor models of the objective regularized by a(ny) power of the step size and applied to c...
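For the cubic case (second-order Taylor model, third power of the step), the standard regularized model is shown below; the general form is a sketch of the class this abstract describes, with T_p the p-th order Taylor expansion and r the regularization power (both assumed from the abstract's wording):

    \[
      m(s) = f(x) + \nabla f(x)^{\top} s
           + \tfrac{1}{2}\, s^{\top} \nabla^{2} f(x)\, s
           + \tfrac{\sigma}{3} \lVert s \rVert^{3},
    \]
    \[
      m_{p,r}(s) = T_p(x, s) + \tfrac{\sigma}{r} \lVert s \rVert^{r},
      \qquad r > p .
    \]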


A second-order globally convergent direct-search method and its worst-case complexity

Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the presence of smoothness, first-order global convergence comes from the ability of the vectors to approximate t...
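A minimal poll step over the positive spanning set {+e_i, -e_i} with a sufficient-decrease test; the forcing function rho and the halving update are illustrative choices, not the specific rules of the paper:

    import numpy as np

    def poll_step(f, x, alpha, rho=lambda a: 1e-4 * a**2):
        """One direct-search poll: accept the first direction giving
        sufficient decrease f(x + alpha d) < f(x) - rho(alpha),
        otherwise shrink the step size."""
        n = x.size
        fx = f(x)
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            if f(x + alpha * d) < fx - rho(alpha):
                return x + alpha * d, alpha   # successful poll
        return x, alpha / 2                   # unsuccessful: halve step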


Simple Complexity Analysis of Direct Search

We consider the problem of unconstrained minimization of a smooth function in the derivative-free setting. In particular, we study the direct search method (of directional type). Despite relevant research activity spanning several decades, until recently no complexity guarantees (bounds on the number of function evaluations needed to find a satisfying point) for methods of this type were establis...





Publication date: 2017